From:                              route@monster.com

Sent:                               Tuesday, June 04, 2013 3:54 PM

To:                                   hg@apeironinc.com

Subject:                          Please review this candidate for: Big Data

 

This resume has been forwarded to you at the request of Monster User xapeix01

Somesh M 

Last updated:  06/04/13

Job Title:  not specified

Company:  not specified

Rating:  Not Rated

Screening score:  not specified

Status:  Resume Received


Coatesville  19320
US

Home: 610-643-4701   
msomesh.hadoop@gmail.com
Contact Preference:  Telephone


 

 

RESUME

  

Resume Headline: Hadoop Developer

Resume Value: p67ba2suh53d2th3   

  

 

                                               Email: msomesh.hadoop@gmail.com 

                                                                                    Ph: 610-643-4701

 

Somesh Manda

 

Objective: Assist business decision-makers in understanding current activities and trends so they can make informed, actionable decisions.

 

Summary

·         Fourteen years of software life cycle experience in ETL, Data Warehouse, Business Intelligence and Client Server applications.

·         Five years of Data Warehousing ETL Team Leadership. Project tracking using MS Project. Kimball methodology.

·         Nine years of strong experience in designing and implementing Data Warehouses using
Informatica PowerCenter / PowerMart v4 through v8.1, Cognos v6, Business Objects v5i, ERwin 4.0.

·         Experience in Big Data and Hadoop technologies.

·         Involved in capacity planning activities, including storage, CPU and memory usage.

·         Mentored other developers.

·         Proven skills: SQL, Oracle v8i-9i-10g, Sybase, MS SQL Server 2000, DB2 UDB, MS Access,
PL/SQL, SQL*Plus, SQL*Loader, Unix Shell Scripting, Business Objects Crystal Reports, Oracle Designer, Supervisor, Web Intelligence.

·         Extensive experience in gathering and documenting business requirements; functional requirements; conducting source data analysis; solution design, development and implementation, data extraction, transformation and loading (ETL); and testing of Data Warehouse solutions.

·         Extensive experience in designing analytical / OLAP and transactional / OLTP databases.
Proficient using ERwin to design backend data models and entity relationship diagrams (ERDs)
for star schemas, snowflake dimensions and fact tables.

·         Worked on CDH, CDM, Hadoop environment (HDFS) setup, MapReduce jobs, Hive, HBase, Pig and NoSQL databases including MongoDB.

·         Worked on a multi-clustered environment and set up the Cloudera Hadoop ecosystem.

Education

·         Master of Science; Computer Science; Andhra University.

·         Bachelor of Science; Mathematics and Statistics; Andhra University.

Technical Skills

Data Warehousing: Informatica PowerMart 4.7/5.0/6.2/7.0, Informatica Power Center 4.1/5.0/5.1/6.2/7.0/8.1.6/9.01, ETL Informatica PowerConnect.

 

Hadoop Ecosystem: HDFS, Sun Grid Engine Administration, Hive, Pig, Flume, Oozie, ZooKeeper, HBase and Sqoop.

 

Databases: Oracle 7.x/8.x/9i, MS SQL Server 6.5/7.0/2000, MS Access, DB2 UDB, Sybase.

 

OLAP Tools: Business Objects Designer 5.1, BO Supervisor, BO Reports, COGNOS Powerplay, Powerplay Transformer, Impromptu administrator and Impromptu.

 

Data Modeling: ER/Studio 3.5/4.0, ERwin 3.5/4.0.

 

Languages: Visual Basic 6.0, Visual Studio .NET, ASP, HTML, XML, JavaScript, PL/SQL.

 

CRM: Siebel Tools, eScript/Siebel VB, eBusiness Application Integration (EAI), Actuate Reports, Workflow Manager, Assignment Manager, EIM and Smart Scripts.

 

Other Software: VB.Net, ASP.Net, Perl, Microsoft Internet Information Server (IIS) 5.0, Microsoft Transaction Server (MTS), TOAD, SQL Station, SQL Navigator, Rational ClearCase, Crystal Reports 6.x/7.x, MS Windows NT/2000/98/95/XP, MS Office Professional, Solaris, Unix, Linux, MS-DOS.

Professional Experience

CTS, Wilmington, DE Jan ’12 – Present

ETL-Hadoop Architect/Developer

Hadoop project: started with a POC and continued with the ongoing development. Project description: a large volume of data from bidders and other networks was streamed daily to the Amazon S3 platform; from S3, the data was loaded into the Hadoop cluster. The data was maintained for 60 days for analysis purposes; 600-700 GB of data was loaded daily into the cluster.

 

Major Responsibilities:

·         Created Java classes for AVRO file formats.

·         Worked with Maven for code migration.

·         Worked with GitHub for version control.

·         Worked with Crucible for code reviews.

·         Worked on integrating third-party libraries such as Dozer and BeanUtils, and used Java reflection.

·         Used Eclipse to write Java and MapReduce programs.

·         Worked as ETL Architect to make sure all the applications (along with the server) were migrated smoothly.

·         Deep understanding of and hands-on experience with the Hadoop stack: internals, HBase, Hive, Pig and MapReduce.

·         Deep understanding of schedulers, workload management, availability, scalability and distributed data platforms.

·         Expert knowledge of developing and debugging in Java/J2EE.

·         Wrote Hive queries and UDFs.

·         Wrote MapReduce jobs.

·         Implemented the Fair Scheduler on the JobTracker to share the cluster's resources among the users' MapReduce jobs.

·         Upgraded the Hadoop cluster from CDH3 to CDH4, set up a high-availability cluster and integrated Hive with existing applications.

·         Automated all the jobs, from pulling data from different data sources like MySQL to pushing the result set data to the Hadoop Distributed File System.

·         Implemented partitioning, dynamic partitions and buckets in Hive.

·         Specified the cluster size, allocated resource pools and distributed Hadoop by writing the specifications in JSON file format.

·         Configured Ethernet bonding for all nodes to double the network bandwidth.

·         Exported result sets from Hive to MySQL using shell scripts.

·         Developed Hive queries for the analysts.

·         Helped the team to increase Cluster from 25 Nodes to 40 Nodes.

·         Wrote Nagios plugins to monitor Hadoop Name Node Health status, number of Task trackers running, number of Data Nodes running.

·         Maintain System integrity of all sub-components (primarily HDFS, MR, HBase, and Flume).

·         Monitor System health and logs and respond accordingly to any warning or failure conditions.
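The Hive partitioning and bucketing work described in the bullets above can be sketched as follows. This is an illustrative example only, not code from the project: the table and column names (`bid_events`, `raw_bids`, `event_date`) are hypothetical placeholders, and on a real cluster the generated script would be submitted with `hive -f`.

```shell
#!/bin/sh
# Generate a HiveQL script demonstrating dynamic partitions and buckets.
# All table/column names below are hypothetical placeholders.
cat > /tmp/partition_demo.hql <<'EOF'
-- Enable dynamic partitioning so Hive derives partitions from the data.
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

CREATE TABLE IF NOT EXISTS bid_events (
  bid_id  STRING,
  network STRING,
  amount  DOUBLE
)
PARTITIONED BY (event_date STRING)
CLUSTERED BY (network) INTO 16 BUCKETS
STORED AS TEXTFILE;

-- Dynamic-partition insert: each row lands in its event_date partition.
INSERT OVERWRITE TABLE bid_events PARTITION (event_date)
SELECT bid_id, network, amount, event_date FROM raw_bids;
EOF

# On a cluster this would be submitted with:  hive -f /tmp/partition_demo.hql
wc -l < /tmp/partition_demo.hql
```

The dynamic-partition settings let one INSERT fan out across many date partitions, which is the usual pattern for daily high-volume loads like the 600-700 GB feed described above.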

Environment: Hadoop, HDFS, MapReduce, Hive, Pig, Eclipse, Java, AVRO, GitHub, Maven, Sqoop, Oracle 9i/10g, SQL Server, MySQL, ER/Studio 3.5/4.0, UNIX Shell Scripting, SQL, PL/SQL, TOAD, Transact-SQL, Windows NT 4.0, SQL Server Management Studio, MySQL Workbench, Informatica PowerCenter 9.0.1, Data Quality 9.0.1, Netezza 4.x.

Shire Pharma, ChesterBrook, PA Jan ’11 – Dec’11

Informatica Architect/Lead

Aggregate Spend is the process used to aggregate and monitor the total amount spent by healthcare manufacturers on individual Healthcare Professionals and Organizations (HCP/O) through payments, gifts, honoraria, travel and other means. Also often referred to as the Physician Payments Sunshine Act, this initiative is a growing body of federal and state legislation that collectively addresses all or some of the following goals: (a) provide transparency with regard to who, in the life sciences industry, is contributing what benefits to which physician; (b) mandate statutory reports at least once a year; and (c) limit spend per physician. Organizations monitored include pharmaceutical, biotechnology and, in some states, medical device organizations.

Group Practice: Worked on building the relationship hierarchy for HCOS and plantrak affiliations.

CADI: Built the data warehouse for corporate accounts. Worked with Managed Care, CARS and Medicare data.

Veeva: Led the project to replace the Siebel CRM system with Veeva.

Major Responsibilities:

 

·         Prepared the checklist to upgrade the existing Informatica 8.1.6 to Informatica 9.0.1.

·         Upgraded the Informatica server and client tools.

·         Managed and mentored a team of five ETL developers.

·         Involved in making architecture decisions and technical road maps.

·         Established coding standards, migration methodology and naming conventions.

·         Communicated with business users to gather requirements.

·         Prepared high- and low-level ETL design documents.

·         Created NZLoad and NZSQL frameworks for Netezza.

·         Worked with the team to set their goals and conducted performance reviews.

·         Talked to external and internal vendors about the data feeds.

·         Delivered projects from beginning to end.

·         Built ETL to calculate the TOT for Veeva and for other call data.

·         Involved in decision making on technical tools.

·         Prepared project plans and assigned tasks to direct reports; worked with them to accomplish their tasks on time.

·         Promoted Informatica processes from Dev to QA and from QA to Production.

·         Worked on identifying and setting up proper distribution keys.

·         Good knowledge of Netezza architecture, including zone maps and distribution keys.

·         Created design documents for ETL mappings.

·         Involved in gathering requirements.

·         Coordinated with the Oracle DBA and Unix admin to achieve better DB performance and identify performance bottlenecks.
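An NZLoad "framework" of the kind mentioned above is typically a thin shell wrapper that standardizes the connection, delimiter and logging options for every load. Below is a minimal sketch under assumptions: the host name, file paths and table names are hypothetical, and the flags approximate the `nzload` CLI; with DRYRUN=1 the wrapper only prints the command instead of requiring the Netezza client tools.

```shell
#!/bin/sh
# Sketch of a reusable nzload wrapper; flag names follow the nzload CLI
# (-db, -t, -df, -delim, -lf, -bf), but host and paths are placeholders.
nz_load() {
    db="$1"; table="$2"; datafile="$3"
    cmd="nzload -host nzhost -db $db -t $table -df $datafile -delim '|' -maxErrors 10 -lf /tmp/$table.nzlog -bf /tmp/$table.nzbad"
    if [ "${DRYRUN:-0}" = "1" ]; then
        # Dry run: show (and record) the command that would be executed.
        echo "$cmd" | tee /tmp/nzload_cmd.txt
    else
        eval "$cmd"   # real invocation; requires the Netezza client tools
    fi
}

# Hypothetical usage: load one delimited feed file into a warehouse table.
DRYRUN=1 nz_load sales_dw fact_spend /data/feeds/spend.dat
```

Centralizing the options this way is what makes a "framework" out of nzload: every job gets consistent error limits, log files and bad-record files for free.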

 

Environment: Informatica PowerCenter 9.0.1, Data Quality 9.0.1, Netezza 4.x, Oracle 9i/10g, SQL Server, MySQL, ER/Studio 3.5/4.0, UNIX Shell Scripting, SQL, PL/SQL, TOAD, Transact-SQL, Windows NT 4.0, SQL Server Management Studio, MySQL Workbench.

AstraZeneca Wilmington, DE Oct ’09 – Dec ‘10

Informatica-Netezza ETL Architect/Lead

The project was to migrate the existing Cornerstone data warehouse from Oracle to Netezza.

 

Major Responsibilities:

 

·         Reviewed the existing Informatica mappings and created design documents to migrate them to Netezza.

·         Converted the existing XPONENT process to ELT to achieve better performance; as a result, the load time was reduced from 70 hours to 4 hours.

·         Converted the existing Oracle materialized and relational views into Netezza views.

·         Worked on performance tuning of Netezza queries.

·         Worked on identifying and setting up proper distribution keys.

·         Good knowledge of Netezza architecture, including zone maps and distribution keys.

·         Created a design approach to lift and shift the existing mappings to Netezza.

·         Created design documents to convert the existing mappings to use Informatica pushdown optimization.

·         Analyzed the impact on the downstream systems and recommended solutions to keep them intact.

·         Planned the Dev, SIT and QA environments.

·         Involved in designing the D/W using Star Schema; identified the fact, dimension and slowly changing dimension tables.

·         Made ETL architecture decisions.

·         Created mappings and workflows/worklets and scheduled them using Workflow Manager and UNIX.

·         Created stored procedures in Netezza.

·         Identified, debugged and resolved issues.

·         Developed reusable frameworks for DB constraints and NZLoad.

·         Coordinated with the Oracle DBA and Unix admin to achieve better DB performance and identify performance bottlenecks.

 

NZ Administrator Tasks:

·         Created and maintained databases.

·         Created users and user groups and assigned permissions.

·         Implemented workload management on Netezza.

·         Assisted the development team with performance tuning.

·         Used nzmigrate to copy data from one Netezza server to another.

·         Extensively used pg.log to monitor activity on Netezza.
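The user/group administration tasks above can be sketched with nzsql. Everything below is hypothetical (the names `etl_grp`/`etl_user`, the password, and the resource limit), and the GRANT/ALTER syntax is only an approximation of Netezza's Postgres-derived SQL; the NPS documentation should be consulted before running anything like it. The sketch only generates the script; on a live system it would be run with `nzsql -f`.

```shell
#!/bin/sh
# Generate an nzsql admin script: group, user, permissions, workload limit.
# etl_grp/etl_user and all settings are hypothetical placeholders.
cat > /tmp/nz_admin.sql <<'EOF'
CREATE GROUP etl_grp;
CREATE USER etl_user WITH PASSWORD 'changeme';
ALTER GROUP etl_grp ADD USER etl_user;
GRANT LIST, SELECT, INSERT ON fact_spend TO etl_grp;
-- Workload management sketch: bound the group's concurrent jobs.
ALTER GROUP etl_grp WITH JOB MAXIMUM 8;
EOF

# On a live system:  nzsql -db system -f /tmp/nz_admin.sql
wc -l < /tmp/nz_admin.sql
```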

 

 

Environment: Informatica PowerCenter 8.1, ETL, MicroStrategy, Netezza 4.x, Oracle 9i/10g, ER/Studio 3.5/4.0, Unix Shell Scripting, SQL, PL/SQL, TOAD, Transact-SQL, Windows NT 4.0.

Platform-A/AOL, Baltimore, MD Aug ’07 – Oct ‘09

ETL Architect/Lead

ETL Team Lead for various projects, including Advertising, War Room and others. Projects included migrating from Business Objects Data Integrator to Informatica.

 

Major Responsibilities:

 

·         Involved in redesigning and rewriting the existing War Room reporting project.

·         Managed a team of ETL developers and database administrators.

·         Established the QA and DEV environments.

·         Worked with Siebel databases and Netezza.

·         Managed projects to move from BODI to Informatica.

·         Hands-on experience with both BODI and Informatica.

·         Installed and configured Informatica 8.1 on UNIX servers.

·         Scheduled work for project team activities and gathered requirements from the business users.

·         Created and maintained the overall and detailed project plan(s) and supervised the D/W ETL processes.

·         Involved in designing the D/W using Star Schema; identified the fact, dimension and slowly changing dimension tables.

·         Made ETL architecture decisions.

·         Involved in designing the ETL processes and writing the design documents.

·         Created workflows/worklets and scheduled them using Workflow Manager and Unix.

·         Wrote numerous pre- and post-session SQL and Unix scripts.

·         Coordinated with the Oracle DBA and Unix admin to achieve better DB performance and identify performance bottlenecks.

 

Environment: Informatica PowerCenter 8.1, ETL, Business Objects, Business Objects Data Integrator 11.5, Netezza 3.0, Oracle 8i/9i, ER/Studio 3.5/4.0, Unix Shell Scripting, SQL, PL/SQL, TOAD, Transact-SQL, Windows NT 4.0.

 

ING DIRECT, Wilmington, DE Dec ’03 – Jul ‘07

Technical/Team Lead

ETL Team Lead for the enterprise data warehouse containing all data pertaining to Deposits, CDs, Mutual Funds, Customer data. Crucial, centralized reporting supporting Direct Marketing analysis, financial reporting, and regulatory compliance for a $3 billion Federal Savings Bank.

Major Responsibilities:

·         Involved in creating design documents, process flow diagrams and technical specs.

·         Managed a team of ETL developers and database administrators.

·         Established the ETL infrastructure, ETL/DW standards and ETL naming conventions.

·         Evaluated the latest versions and educated other developers on new changes.

·         Scheduled work for project team activities and gathered requirements from the business users.

·         Created and maintained the overall and detailed project plan(s) and supervised the D/W ETL processes.

·         Involved in designing the D/W using Star Schema; identified the fact, dimension and slowly changing dimension tables.

·         Involved in the upgrade of Informatica 5.x to 6.x.

·         Made ETL architecture decisions.

·         Involved in performing Informatica administrative tasks such as creating logins/passwords, taking backups, registering repositories and restoring the repository.

·         Involved in designing the ETL processes and writing the design documents.

·         Created workflows/worklets and scheduled them using Workflow Manager and Unix.

·         Wrote numerous pre- and post-session SQL and Unix scripts.

·         Coordinated with the Oracle DBA and Unix admin to achieve better DB performance and identify performance bottlenecks.

·         Worked with PowerMart/PowerCenter client tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Repository Manager, Mapplet Designer and Transformation Developer.

 

Environment: Informatica PowerCenter 5.x/6.2/7.x/8.1, ETL, Cognos Impromptu Administrator, Cognos PowerPlay Transformer, Cognos PowerPlay, Oracle 8i/9i, ER/Studio 3.5/4.0, Unix Shell Scripting, SQL, PL/SQL, TOAD, Transact-SQL, Windows NT 4.0.

 

Merck & CO, West Point, PA July ’02 – Nov ‘03

ETL Tech Lead

Description:

Customer relationship management is a business strategy to help an enterprise understand the needs of its customers. As a fundamental component of this strategy, the Merck Standard Customer Repository (MSCR) was built to provide the central repository for valid customer information for Merck. MSCR also stores alternate IDs, addresses, licenses, credentials, corporate designations and association information along with customer information. Data is populated into MSCR by external and internal loads. External data comes from various sources such as AMA, OMS, AOA, IMS and CIMS.

Major Responsibilities:

Informatica:

·         Involved in designing Logical and physical databases for the staging and D/W using ERWIN.

·         Involved in design, analysis, implementation and support of ETL processes.

·         Prepared ETL standards, Naming conventions and wrote ETL flow documentation.

·         Responsible for the code drops and migrating processes into the production.

·         Involved in working with WorkFlow Manager, WorkFlow Monitor, Designer, Repository Manager, Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.

·         Involved in working with Business objects Supervisor, Report designer and Universe.

·         Used Striva/Powerconnect to access DB2 tables and data.

·         Created WorkFlows/Worklets and scheduled them using workflow manager and Unix.

·         Created various Reusable and Non-Reusable tasks like Session, Assignment, Command, Control, Decision, Event and Timer.

·         Wrote numerous Pre and Post session SQL and Unix Commands.

·         Created workflows to partition data using Pipeline, Round-Robin, Hash, Key Range and Pass-through partitions.

·         Integrated third-party tools like Postal Certification (FirstLogic) and Vality (Integrity) with Informatica to validate addresses and zip codes and to dedupe addresses.

·         Designed and developed various kinds of mappings using PowerMart/PowerCenter transformations such as Expression, Aggregator, External Procedure, Stored Procedure, Lookup, Filter, Joiner, Rank, Router, Update Strategy and XML.

·         Wrote stored procedures, functions, database triggers and Unix shell scripts to support and automate the ETL process.

·         Developed new Business Objects universes and deployed them to users worldwide.

·         Identified the objects and classes that needed to be created in the universe.

·         Involved in describing, creating, building and maintaining the universe.

·         Involved in resolving loops, fan traps and chasm traps by creating aliases and contexts.

·         Identified and created joins and cardinalities between tables.

·         Involved in performance tuning of SQL queries using Explain Plan.

Environment: Informatica Powermart 5.1/6.2, Informatica PowerCenter 5.x/6.2, ETL, Business Objects 5.1, Oracle 8i/9i, ER/Studio 3.5/4.0, Unix Shell Scripting, SQL, PL/SQL, TOAD, Transact-SQL, Windows NT 4.0.

 

National HealthCare Resources Inc /Concentra, NJ Jan’00 – June ‘02

Data Warehouse Consultant.

Description:

The project was to build a sales data warehouse. The objective was to extract data stored in different databases (DB2, SQL Server and Sybase) and load it into a single data warehouse repository in an Oracle database.

Major Responsibilities:

·         Involved in the installation and configuration of Informatica PowerCenter, PowerMart, the Informatica client and the Informatica server.

·         Used Striva/PowerConnect to access DB2 tables and data.

·         Installed the Informatica server on both Windows and Unix platforms.

·         Upgraded Informatica PowerCenter 1.7 to 5.0/5.1.

·         Troubleshot connectivity problems; looked up and read session, event and error logs.

·         Took backups of and restored the Informatica repository.

·         Worked on PowerMart/PowerCenter client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.

·         Created and managed Informatica local and global repositories.

·         Identified and loaded source and target metadata into the repository.

·         Created, launched and scheduled sessions. Configured email notification.

·         Set up batches and sessions to schedule the loads at the required frequency using PowerCenter Server Manager. Generated completion messages and status reports using Server Manager.

·         Extensively worked on performance tuning of programs, ETL procedures and processes.

·         Created user groups, users and folders and assigned permissions.

·         Extensively used ETL to load data from Oracle 8i, MS SQL Server 7.0 and flat files to Oracle 8i.

·         Used most of the transformations, such as Source Qualifier, Aggregator, Lookup, Filter, Sequence and Update Strategy.

·         Extensively involved in UNIX shell scripting.

·         Created tables, functions, packages, triggers and user-defined data types.

·         Wrote PL/SQL packages and stored procedures and created triggers to implement business rules and validations.

·         Developed new Business Objects universes and deployed them to users worldwide.

·         Identified fact and dimension tables and built indexes to improve performance.

·         Monitored performance and tuned queries.

·         Created universes for report generation using the Designer module. As part of universe development, created classes, different kinds of objects and cardinalities.

·         Resolved loops using aliases and contexts.

·         Created ad hoc reports using Business Objects. As part of report development, created reports using universes as the main data providers and using powerful Business Objects functionality such as @functions, user response and formulas.

·         Exported the universes and documents to the repository.

 

Environment: Informatica Powermart 4.7/5.0, Informatica PowerCenter 1.7/5.x, ETL, Business Objects, Oracle 8i, Sybase 12.0, SQL Server 7.0, MS SQL Server OLAP Services, ER/Studio 3.5/4.0, Unix Shell Scripting, SQL, PL/SQL, TOAD, Transact-SQL, Crystal Reports 7/8.5, Windows NT 4.0.

 

Office of Emergency Management, NYC Mayor's Office, New York Dec ’99 to Jan ’00

Built a central data warehouse with data coming from different sources such as Oracle, SQL Server and DB2.

Major Responsibilities:

·         Involved in building the data warehouse using Informatica.

·         Created the Informatica repository using Informatica PowerCenter.

·         Identified and loaded source and target metadata into the repository.

·         Created mappings and different kinds of transformations, such as Source Qualifier, Aggregator, Lookup, Filter, Sequence, Stored Procedure and Update Strategy.

·         Worked with Informatica PowerMart client tools such as Source Analyzer, Warehouse Designer and Mapping Designer.

·         Created, launched and scheduled sessions and batches.

·         Involved in designing the logical database using ERwin.

·         Extensively used ETL to load data from Oracle 8i, MS SQL Server 7.0 and flat files to Oracle 8i.

·         Created tables, views and indexes and wrote stored procedures, functions, packages and triggers using SQL.

·         Created and managed database objects with SQL Server 7.0.

 

Environment: Informatica Powermart 4.7, Informatica PowerCenter 1.7, ETL, Oracle 8.x, SQL Server 7.0, MS SQL Server 7.0/2000, Unix Shell Scripting, SQL, PL/SQL, TOAD, Transact-SQL, Crystal Reports 6.5/7, Windows NT 4.0.

 

Abilities Inc. Long Island, NY Apr ’99 to Nov’99

Worked on the complete life cycle of converting the current CEI and Placement systems from FoxPro and Paradox to an SQL Server/Oracle based system. This involved collecting consumer requirements, understanding the current system, and design and implementation. The two systems were merged into one comprehensive system, thereby improving performance and eliminating redundant data.

Major Responsibilities:

·         Wrote Visual Basic class modules and procedures for different system modules.

·         Designed and developed the front end and COM objects in VB.

·         Designed and developed ActiveX DLLs for data security and performance in Visual Basic 6.0.

·         Created stored procedures in SQL Server to add modularity and improve performance.

·         Used OLE DB and ADO to connect to SQL Server from VB.

·         Involved in data conversion from Paradox to SQL Server.

·         Created tables, views and indexes and wrote stored procedures, functions, packages and triggers using T-SQL/Enterprise Manager.

·         Wrote SQL statements in SQL Server to retrieve and modify data.

Environment: Visual Basic 6.0, SQL Server 7.0/2000, ADO, COM, Crystal Reports 7.0/8.5, Visual Interdev, Paradox and FoxPro.

 

Software Solutions INC. NY, NY Jan ’99-Apr ’99

Web-based employee human resources system for Software Solutions Inc., which has offices all over the USA. The website keeps track of all employees, open/closed requirements, time sheets and payroll.

 

Environment: ASP 2.0, Visual Basic 6.0, HTML, DHTML, SQL Server 6.5/7.0, Java Script, VB Script, ADO, COM, Crystal Reports 6.0/7.0, Visual Interdev, Front Page.

 

Sumac Technologies Apr 96 to Dec 98

The system was developed for a printer sales company that mainly dealt with assembled computers. The system covered printing quotations, printing delivery notes, billing and inventory management. The company sent printers out for demos and for sale, and the system recorded both. An MIS facility was provided to track demo printers that were sold or damaged. The system also recorded salesmen's sales details and helped management assess salesman performance.

 

Environment: Visual Basic 4.0, Oracle 7.x, Crystal Reports.

 

Inventory System. Dec 95 to Apr 96

As a software engineer, was involved in system study, design and development. This product caters to sales, pricing, invoices and inventory. It records all information on buyers, vendors, items, payments and warranty periods. Produced invoices and reports on warranties, credits, payments, balances and accounts using Crystal Reports.

 

Environment: Visual Basic 4.0, Oracle 7.x, Crystal Reports.

 

 



Experience

 

Job Title: ETL-Hadoop Architect/Developer
Company: CTS
Experience: - Present

 

Additional Info

 

Current Career Level:

Manager (Manager/Supervisor of Staff)

Work Status:

US - I am authorized to work in this country for my present employer only.

Active Security Clearance:

None

US Military Service:

Citizenship:

None

 

 

Target Job:

Target Job Title:

Hadoop Developer

Desired Job Type:

Intern

 

Target Company:

Company Size:

 

Target Locations:

Selected Locations:

US-DE-Delaware
US-PA-Philadelphia

Relocate:

Yes

Willingness to travel:

Up to 25% travel

 

Languages:

English: Fluent